Updates for Nonlinear Discriminants
Authors
Abstract
A novel training algorithm for nonlinear discriminants for classification and regression in Reproducing Kernel Hilbert Spaces (RKHSs) is presented. It is shown how the overdetermined linear least-squares problem in the corresponding RKHS may be solved within a greedy forward selection scheme by updating the pseudoinverse in an order-recursive way. The described construction of the pseudoinverse gives rise to an update of the orthogonal decomposition of the reduced Gram matrix in linear time. Regularization in the spirit of ridge regression may then easily be applied in the orthogonal space. Various experiments for both classification and regression are performed to show the competitiveness of the proposed method.
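The core idea of appending one basis column at a time and updating the pseudoinverse order-recursively can be sketched with Greville's classical update formula. The sketch below is an illustration of that general technique, not the paper's exact algorithm; the function and variable names are hypothetical, and the forward selection loop simply appends columns in order rather than performing the paper's greedy scoring.

```python
import numpy as np

def append_column_pinv(A, A_pinv, a):
    """Greville's order-recursive pseudoinverse update: given pinv(A),
    compute pinv([A a]) when a new column a is appended to A.
    Costs O(n*k) per column instead of recomputing pinv from scratch."""
    d = A_pinv @ a               # coefficients of a's projection onto range(A)
    c = a - A @ d                # residual of a orthogonal to range(A)
    if np.linalg.norm(c) > 1e-10:
        b = c / (c @ c)          # new column adds an independent direction
    else:
        b = (A_pinv.T @ d) / (1.0 + d @ d)   # new column is linearly dependent
    # Block form of the updated pseudoinverse: [[A_pinv - d b], [b]]
    return np.vstack([A_pinv - np.outer(d, b), b])

# Toy forward-selection loop: grow the design matrix one column at a time
# (e.g. one kernel column per selected basis vector) and keep the
# pseudoinverse of the reduced system up to date.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))    # candidate columns
y = rng.normal(size=50)

A = X[:, [0]]
A_pinv = np.linalg.pinv(A)
for j in range(1, 4):
    a = X[:, j]
    A_pinv = append_column_pinv(A, A_pinv, a)
    A = np.column_stack([A, a])

coef = A_pinv @ y                # least-squares solution for the selected columns
```

After each update, `A_pinv @ y` gives the least-squares coefficients for the currently selected columns, which is what a greedy selection criterion would score candidate columns against.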
Similar References
Boosted Dyadic Kernel Discriminants
We introduce a novel learning algorithm for binary classification with hyperplane discriminants based on pairs of training points from opposite classes (dyadic hypercuts). This algorithm is further extended to nonlinear discriminants using kernel functions satisfying Mercer’s conditions. An ensemble of simple dyadic hypercuts is learned incrementally by means of a confidence-rated version of Ad...
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
Classification and Reductio-ad-Absurdum
Proofs for the optimality of classification in real-world machine learning situations are constructed. The validity of each proof requires reasoning about the probability of certain subsets of feature vectors. It is shown that linear discriminants classify by making the least demanding assumptions on the values of these probabilities. This enables measuring the confidence of classification by l...
PII: S0031-3203(01)00107-8
This paper introduces a novel nonlinear extension of Fisher’s classical linear discriminant analysis (FDA) known as high-order Fisher’s discriminant analysis (HOFDA). The ability of the new method to capture nonlinear relationships stems from its use of an extended polynomial space constructed out of the original features. Furthermore, a genetic algorithm (GA) is used in order to incrementally ...
On Associated Discriminants
In this note we introduce a family of discriminants, indexed by i = 0, …, n − 2, in the space P_n of polynomials of degree n in one variable, and study some of their algebraic and topological properties following [Ar]–[Va] and [GKZ]. The i-th discriminant consists of all polynomials p such that some nontrivial linear combination c_0 p + c_1 p′ + ⋯ + c_i p^(i) has a zero of multiplicity greater than or equal to i + 2. In pa...